
# Whole Word Masking Technique
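The models under this tag build on BERT-style pre-training in which masking is applied to whole words rather than to individual WordPiece tokens. The sketch below is a minimal illustration of that idea, not the actual pre-training code of any model listed here; it assumes the `bert-base-uncased` tokenizer from Hugging Face `transformers` and the standard 15% masking rate.

```python
import random
from transformers import AutoTokenizer

# Illustrative sketch of whole word masking: once a word is selected,
# every WordPiece belonging to it is masked together.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

tokens = tokenizer.tokenize("the patient presented with tachycardia")

# Group continuation pieces (prefixed with "##") with the word they belong to.
words, current = [], []
for tok in tokens:
    if tok.startswith("##") and current:
        current.append(tok)
    else:
        if current:
            words.append(current)
        current = [tok]
if current:
    words.append(current)

# Mask roughly 15% of the whole words, replacing all of a word's pieces at once.
masked = []
for word in words:
    if random.random() < 0.15:
        masked.extend([tokenizer.mask_token] * len(word))
    else:
        masked.extend(word)
print(masked)
```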

Clinical Pubmed Bert Base 512
MIT
A BERT model pre-trained on PubMed abstracts and further pre-trained on MIMIC-III clinical notes, intended for clinical decision support
Large Language Model · Transformers · English
Tsubasaz
27
5
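A minimal usage sketch for this card, assuming the model corresponds to the Hugging Face repository `Tsubasaz/clinical-pubmed-bert-base-512` (the exact repo id is not shown on this page) and loads through the standard `fill-mask` pipeline; the clinical sentence is a made-up example.

```python
from transformers import pipeline

# Assumed repo id; the listing above does not show the exact Hugging Face path.
fill_mask = pipeline("fill-mask", model="Tsubasaz/clinical-pubmed-bert-base-512")

# Hypothetical clinical sentence; [MASK] is the BERT mask token.
for pred in fill_mask("The patient was started on [MASK] for hypertension."):
    print(pred["token_str"], round(pred["score"], 3))
```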
Chinese Macbert Base
Apache-2.0
MacBERT is an improved BERT that adopts a novel MLM-as-correction pre-training task, alleviating the discrepancy between the pre-training and fine-tuning stages.
Large Language Model · Chinese
hfl
22.48k
132
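A minimal loading sketch for this card, assuming the model corresponds to the Hugging Face repository `hfl/chinese-macbert-base` (the listing does not show the repo id). MacBERT changes the pre-training objective but keeps the standard BERT architecture, so it loads with the usual masked-LM classes; the example sentence is made up.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed repo id for the card above.
model_id = "hfl/chinese-macbert-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Hypothetical example: "The weather is [MASK] good today."
inputs = tokenizer("今天天气[MASK]好。", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```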
Bert Large Uncased Whole Word Masking Squad2 With Ner Mit Restaurant With Neg With Repeat
This model is a fine-tuned version of bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and mit_restaurant datasets, supporting token classification tasks.
Sequence Labeling · Transformers · English
andi611
18
0
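A minimal usage sketch for this card, assuming the fine-tuned model is published on Hugging Face under the author `andi611` with a repo name matching the title above (the exact path is not shown on this page and may differ); the restaurant query is a made-up example in the style of the mit_restaurant dataset.

```python
from transformers import pipeline

# Assumed repo id built from the author and title above; the actual
# Hugging Face path is not shown on this page and may differ.
model_id = (
    "andi611/bert-large-uncased-whole-word-masking-squad2-"
    "with-ner-mit_restaurant-with-neg-with-repeat"
)

# Token classification (NER-style) pipeline; "simple" aggregation merges
# word pieces back into whole-word entity spans.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")
print(ner("Book a table for two at a cheap Italian place near downtown."))
```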